Improving MCMC, using efficient importance sampling
Authors
Abstract
This paper develops a systematic Markov Chain Monte Carlo (MCMC) framework based upon Efficient Importance Sampling (EIS) which can be used for the analysis of a wide range of econometric models involving integrals without an analytical solution. EIS is a simple, generic and yet accurate Monte Carlo integration procedure based on sampling densities which are chosen to be global approximations to the integrand. By embedding EIS within MCMC procedures based on Metropolis-Hastings (MH) one can significantly improve their numerical properties, essentially by providing a fully automated selection of critical MCMC components such as auxiliary sampling densities, normalizing constants and starting values. The potential of this integrated MCMC-EIS approach is illustrated with simple univariate integration problems and with the Bayesian posterior analysis of stochastic volatility models and stationary autoregressive processes.
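The abstract's simple univariate integration illustration can be sketched in a few lines. The following Python snippet is a minimal, simplified rendering of the EIS idea, not the authors' implementation: it fits a Gaussian auxiliary sampler by (unweighted) least-squares regression of the log-integrand on the sampler's sufficient statistics and then uses the fitted density as the proposal in an independence Metropolis-Hastings chain. The integrand `log_phi`, the Gaussian sampler family, and all tuning constants are assumptions chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical unnormalized target density phi(x) (not taken from the paper).
def log_phi(x):
    return -0.25 * x**4 - 0.5 * x**2

def eis_gaussian(log_phi, n_draws=2000, n_iter=10, mu=0.0, sigma=3.0):
    """Fit a Gaussian EIS sampler: regress log phi on (1, x, x^2) using draws
    from the current sampler, then read off the implied Gaussian parameters."""
    for _ in range(n_iter):
        x = rng.normal(mu, sigma, size=n_draws)
        X = np.column_stack([np.ones_like(x), x, x**2])
        beta, *_ = np.linalg.lstsq(X, log_phi(x), rcond=None)
        sigma = np.sqrt(-0.5 / beta[2])   # valid while the x^2 coefficient is negative
        mu = beta[1] * sigma**2
    return mu, sigma

def log_norm_pdf(x, mu, sigma):
    return -0.5 * np.log(2 * np.pi * sigma**2) - 0.5 * ((x - mu) / sigma) ** 2

def independence_mh(log_phi, mu, sigma, n_steps=5000):
    """Independence Metropolis-Hastings using the EIS density as the proposal."""
    x = mu
    draws = np.empty(n_steps)
    for t in range(n_steps):
        prop = rng.normal(mu, sigma)
        # MH log acceptance ratio for an independence proposal.
        log_alpha = (log_phi(prop) - log_phi(x)
                     + log_norm_pdf(x, mu, sigma) - log_norm_pdf(prop, mu, sigma))
        if np.log(rng.uniform()) < log_alpha:
            x = prop
        draws[t] = x
    return draws

mu, sigma = eis_gaussian(log_phi)
draws = independence_mh(log_phi, mu, sigma)
print(f"EIS proposal: N({mu:.3f}, {sigma:.3f}^2); posterior mean ~ {draws.mean():.3f}")
```

In the paper's setting the same construction is applied to high-dimensional latent-variable integrals, such as those arising in stochastic volatility models, where the EIS step automates the choice of the MH auxiliary sampling density.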
Similar references
Improving MCMC Using Efficient Importance Sampling
This paper develops a systematic Markov Chain Monte Carlo (MCMC) framework based upon Efficient Importance Sampling (EIS) which can be used for the analysis of a wide range of econometric models involving integrals without an analytical solution. EIS is a simple, generic and yet accurate Monte Carlo integration procedure based on sampling densities which are chosen to be global approximations to ...
Rééchantillonnage de l’échelle dans les algorithmes MCMC pour les problèmes inverses bilinéaires
This article presents an efficient method for improving the behavior of the MCMC sampling algorithm involved in the resolution of bilinear inverse problems. Blind deconvolution and source separation are among the applications that benefit from this improvement. The proposed method addresses the scale ambiguity inherent to bilinear inverse problems. Solving this type of problem within a Bayesian...
Learning Deep Generative Models with Doubly Stochastic MCMC
We present doubly stochastic gradient MCMC, a simple and generic method for (approximate) Bayesian inference of deep generative models in the collapsed continuous parameter space. At each MCMC sampling step, the algorithm randomly draws a minibatch of data samples to estimate the gradient of log-posterior and further estimates the intractable expectation over latent variables via a Gibbs sample...
Markov Interacting Importance Samplers
We introduce a new Markov chain Monte Carlo (MCMC) sampler called the Markov Interacting Importance Sampler (MIIS). The MIIS sampler uses conditional importance sampling (IS) approximations to jointly sample the current state of the Markov Chain and estimate conditional expectations, possibly by incorporating a full range of variance reduction techniques. We compute Rao-Blackwellized estimates ...
Adaptive Metropolis-Hastings samplers for the Bayesian analysis of large linear Gaussian systems
This paper considers the implementation of efficient Bayesian computation for large linear Gaussian models containing many latent variables. A common approach is to implement a simple MCMC procedure such as the Gibbs sampler or data augmentation, but these methods are often unsatisfactory when the model is large. This motivates the need to develop other strategies for improving MCMC. This paper...
Journal: Computational Statistics & Data Analysis
Volume: 53, Issue: -
Pages: -
Publication date: 2008